Minimizing Nonconvex Non-Separable Functions

Authors

  • Yaoliang Yu
  • Xun Zheng
  • Micol Marchetti-Bowick
  • Eric P. Xing
Abstract

Regularization has played a key role in deriving sensible estimators in high-dimensional statistical inference. A substantial body of recent work has argued for nonconvex regularizers on account of their superior theoretical properties and excellent practical performance. In a different but analogous vein, nonconvex loss functions are promoted for their robustness against "outliers". However, these nonconvex formulations are computationally more challenging, especially in the presence of nonsmoothness and nonseparability. To address this issue, we propose a new proximal gradient meta-algorithm by rigorously extending the proximal average to the nonconvex setting. We formally prove its convergence properties and illustrate its effectiveness on two applications: multi-task graph-guided fused lasso and robust support vector machines. Experiments demonstrate that our method compares favorably against other alternatives.
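To make the proximal-average idea concrete, here is a minimal sketch (not the authors' exact algorithm) of a proximal gradient step in which the prox of a sum of simple regularizers is approximated by the *average of the individual prox maps*. The functions `prox_l1`, `prox_box`, and the toy least-squares loss are illustrative choices, not taken from the paper:

```python
import numpy as np

def prox_l1(w, t):
    # prox of t * ||.||_1 : soft-thresholding
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def prox_box(w, t, r=1.0):
    # prox of the indicator of the box [-r, r] : projection (t is unused)
    return np.clip(w, -r, r)

def prox_average_step(w, grad, eta, proxes):
    # One proximal-gradient step: a gradient step on the smooth loss,
    # followed by the proximal average of the (possibly nonconvex) regularizers.
    z = w - eta * grad(w)
    return np.mean([p(z, eta) for p in proxes], axis=0)

# Usage: minimize 0.5 * ||w - b||^2 plus a mix of l1 and box regularization.
b = np.array([3.0, -0.2, 1.5])
grad = lambda w: w - b          # gradient of the smooth quadratic loss
w = np.zeros_like(b)
for _ in range(200):
    w = prox_average_step(w, grad, 0.5, [prox_l1, prox_box])
```

The appeal of the averaging step is that each `prox` map is cheap to evaluate in closed form, whereas the exact prox of the summed regularizer may have no closed form at all.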


Related articles

Supplement for Minimizing Nonconvex and Non-Separable Functions

v ∈ ∂f(w) ⟺ ∃ wₙ → w with f(wₙ) → f(w), vₙ ∈ ∂̂f(wₙ), vₙ → v. Clearly, ∂̂f(w) ⊆ ∂f(w) for all w. If f is (resp. continuously) differentiable at w, then ∂̂f(w) (resp. ∂f(w)) coincides with the usual derivative. From the definition it follows that if w is a local minimizer, then 0 ∈ ∂̂f(w) and 0 ∈ ∂f(w), which generalizes the familiar Fermat's rule. In the main text, we are interested in finding some w so t...
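As a concrete illustration (not taken from the supplement itself), the two subdifferentials can genuinely differ: for the nonconvex function f(w) = −|w| on the real line,

```latex
% f(w) = -|w|:  no v satisfies  -|w| \ge v\,w + o(|w|)  near 0,
% so the Frechet subdifferential at 0 is empty; but approaching 0 from
% the right (gradient -1) or the left (gradient +1) yields limit points:
\hat{\partial} f(0) = \varnothing,
\qquad
\partial f(0) = \{-1,\, +1\}.
```

This shows why the limiting subdifferential ∂f is the right object for stating Fermat's rule at nonsmooth, nonconvex points.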


Katyusha X: Practical Momentum Method for Stochastic Sum-of-Nonconvex Optimization

The problem of minimizing sum-of-nonconvex functions (i.e., convex functions that are averages of non-convex ones) is becoming increasingly important in machine learning, and is the core machinery for PCA, SVD, regularized Newton's method, accelerated non-convex optimization, and more. We show how to provably obtain an accelerated stochastic algorithm for minimizing sum-of-nonconvex functions, by...


Nonlocal Patch-based Image Inpainting through Minimization of a Sparsity Promoting Nonconvex Functional

We propose a convex model for nonlocal image inpainting that fills in missing or damaged areas with different convex combinations of available image patches. The visual quality of inpainted solutions is much better when patches are copied instead of averaged, and we show how this can be achieved by adding a nonconvex sparsity promoting penalty. To promote sparsity of a variable that is constrai...


On Minimizing Quadratically Constrained Ratio of Two Quadratic Functions

We consider the nonconvex problem of minimizing the ratio of two quadratic functions over finitely many nonconvex quadratic inequalities. Relying on the homogenization technique, we establish a sufficient condition that warrants the attainment of an optimal solution. Our result allows us to extend and recover known conditions for some interesting special instances of the problem and to derive further ...



Journal:

Volume   Issue

Pages  -

Publication date: 2015